Patent abstract:
A method of analyzing a multispectral image (10) includes constructing a detection image from values of a revealing function that quantifies a content difference of the multispectral image between two areas. The revealing function is applied between a target area and a background area, within a window that is determined around each pixel. The values of the revealing function are determined from integral images of order one, and possibly also of order two, which are themselves calculated only once initially, so that the total amount of computations is reduced. The analysis method is compatible with real-time implementation during successive multispectral image capture, especially for an environmental monitoring mission.
Publication number: FR3013878A1
Application number: FR1361739
Filing date: 2013-11-28
Publication date: 2015-05-29
Inventors: Marc Bousquet; Maxime Thiebaut; Nicolas Roux; Philippe Foubert; Thierry Touati
Applicant: Sagem Defense Securite SA
IPC main class:
Patent description:

[0001] The present invention relates to a method for analyzing a multispectral image, as well as a computer program for implementing such a method. It also relates to a method and a device for monitoring an environment.
[0002] Monitoring an environment is a common task, especially to detect enemy intrusions. Such monitoring presents particular difficulties when carried out in a terrestrial environment. Indeed, a terrestrial environment such as a countryside landscape can contain a large number of distinct elements with irregular contours, such as trees, bushes, rocks, buildings, etc., which make the interpretation of the image and the search for intruding elements difficult. In addition, under certain circumstances, such as military surveillance, an intruder may be camouflaged to make its detection in the landscape more difficult. Commonly, such camouflage is effective against observation in visible light, especially for wavelengths between 0.45 μm (micrometer) and 0.65 μm, and especially around 0.57 μm, which corresponds to the maximum sensitivity of the human eye. To successfully detect the intruding element, which is also called the target in the jargon of the skilled person, despite a complex landscape and possible camouflage of the target, it is known to perform a multispectral observation of the environment. Such a multispectral observation consists in simultaneously capturing several images of the same landscape in different spectral bands, so that a target which does not appear distinctly in the images captured in certain spectral bands is revealed by the images corresponding to other spectral bands. Each spectral band can be narrow, with an interval of wavelength values of a few tens of nanometers, or wider, or even very wide with a width of several micrometers, especially when the spectral band is located in one of the infrared domains: between 3 μm and 5 μm, or between 8 μm and 12 μm. It is thus known that an observation in the wavelength range between 0.8 μm and 1.2 μm can be effective in revealing a target in a vegetation environment, while the target is effectively camouflaged against detection by observation in the range of light visible to the human eye. However, such multispectral detection may still be insufficient to allow an operator who is in charge of surveillance to detect the presence of a target in a terrestrial environment. Indeed, in some circumstances, none of the images that are separately associated with the spectral bands shows the target sufficiently distinctly for the surveillance operator to detect the target in these images, given the observation time allocated. In the following, each image that corresponds separately to one of the spectral bands is called a spectral image. For such situations, it is further known to improve the efficiency of target detection by presenting to the operator an image that is constructed by Fisher projection. Such a method is known in particular from the article "Some practical issues in anomaly detection and exploitation of regions of interest in hyperspectral images", F. Goudail et al., Applied Optics, Vol. 45, No. 21, pp. 5223-5236. According to this method, the image that is presented to the operator is constructed by combining, at each point thereof, called a pixel, the intensity values that are captured separately for several spectral bands, so as to optimize a contrast of the resulting image. Theoretically, this image construction consists in projecting, for each pixel, the vector of the intensities that have been captured for the selected spectral bands onto an optimal direction in the multidimensional space of the spectral intensity values.
This optimal projection direction can be determined from the covariance matrix of the spectral intensities, estimated over the entire image field. This amounts in fact to seeking a maximum correlation between the variations of intensities that are present in the different images captured in the selected spectral bands. The contrast of the image that is presented to the operator is thus at least equal to that of each separate spectral image, so that target detection by the operator is both more efficient and more reliable. Alternatively, the optimal projection direction can be sought directly using a usual optimization algorithm, to maximize the image contrast by varying the projection direction in the multidimensional space of the spectral intensities.
[0003] It is known to improve the Fisher projection contrast within a spatial window that is smaller than the entire image. The surveillance operator then selects the window inside the entire image, including its position, depending on the nature of the environment at that location in the image and on his intention to focus the search for a potential target there. To facilitate such a selection of the position of the window, and also to facilitate the identification of the nature of an intrusion that would occur in this area, it is further known to present to the operator a composite image on a screen. Such a composite image may consist of one of the spectral images outside the window, and the portion of multispectral image that results from the Fisher projection inside the window. Alternatively, several spectral images whose wavelength ranges are contained within the sensitivity range of the human eye can be used to display a landscape representation in natural or near-natural colors outside the window. But in the composite image that is presented to the operator, the enhanced contrast that is provided by the Fisher projection is restricted to the inside of the window. Because of this restriction, the monitoring operator does not have a visualization of the whole field of view with a strengthened contrast. He is then not able to quickly become aware of the extent of an enemy intrusion that is camouflaged, because of the time that is necessary to scan the entire field of observation with windows that are selected and processed successively. Given this situation, a general purpose of the invention is then to make the monitoring of an environment from a multispectral image more reliable.
[0004] In other words, the general object of the invention is to further reduce the probability of non-detection of an intruding element in a scene that is captured in a multispectral image. More particularly, the purpose of the invention is to provide the surveillance operator with an image of the field of view in real time, in which the contrast is reinforced at every point of this image as a function of the information which is contained in the multispectral image. In other words, the invention aims to provide the operator with an image of the field of view which is improved in its entirety, easily interpretable, and which can be produced with a very short calculation time. Preferably, this duration is compatible with the real-time reception of a video stream from a camera used to continuously capture successive multispectral images.
[0005] To achieve some of these goals or others, a first aspect of the invention provides a novel method of analyzing a multispectral image, said multispectral image comprising a plurality of spectral images of the same scene but corresponding to different spectral intervals, each spectral image assigning an intensity value to each pixel, or image point, which is located at an intersection of a row and a column of a matrix of the multispectral image, and a point of origin being defined at a corner of the peripheral boundary of the matrix. According to the invention, a detection image is constructed by assigning a display value to each pixel of a useful area of the matrix, which display value is derived from a revealing function that quantifies a content difference of the multispectral image between two zones that are determined by this pixel. For this, the method comprises the following steps:
/1/ compute integral images of at least one of the following types:
- for each spectral image, an integral image of order one, by attributing to each calculation pixel an integral value equal to a sum of the intensity values of this spectral image for all the pixels that are contained in a rectangle having two opposite vertices located respectively on the origin point and on the calculation pixel;
- for each pair of spectral images coming from the multispectral image, an integral image of order two, by attributing to each calculation pixel another integral value equal to a sum, for all the pixels which are contained in the rectangle having two opposite vertices located respectively on the origin point and on the calculation pixel, of the products of the two intensity values relating to the same pixel but respectively assigned by each spectral image of the pair;
/2/ define a fixed window frame and a mask internal to the frame, this mask defining a target area and a background area inside the frame; and
/3/ for each pixel of the useful area of the matrix, perform the following substeps:
/3-1/ place a window at a position in the matrix that is determined by the pixel, the window being limited by the frame defined in step /2/;
/3-2/ using integral values read in the integral images calculated in step /1/, determine a value of the revealing function that quantifies the content difference of the multispectral image between the target area and the background area inside the window at the position determined by the pixel; and
/3-3/ assign the value of the revealing function, calculated in substep /3-2/, to the pixel for which step /3/ is executed.
The values of the revealing function are then used to construct the detection image, pixel by pixel in the useful area of the matrix. Thus, a first feature of the invention consists in presenting to the surveillance operator an image which is entirely composed from the values of the revealing function, resulting from a local processing of the multispectral image. This image, called the detection image, is homogeneous in its nature and its method of calculation. A second feature of the invention is to use a method of calculating the revealing function which is based on integral images. In a first step, the spectral images are converted into integral images of order one and/or order two, and then the revealing function is calculated from these integral images.
Thus, the sums of intensity values which are computed over large numbers of pixels are performed only once; then the results of these sums are read and combined to obtain the value of the revealing function for each pixel.
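By way of illustration, the following sketch (in Python with NumPy; all function and variable names are hypothetical, not taken from the patent) shows this two-phase structure: the integral images are computed once, then each pixel of the useful area only triggers a few reads. The helpers `integral_image` and `reveal` are developed in the sketches further below.

```python
import numpy as np

def analyze_multispectral(cube, frame_shape, reveal):
    """Sketch of the two-phase analysis, assuming `cube` is a (d, H, W)
    stack of spectral images and `reveal` computes the revealing function
    for one window position from integral values only."""
    # Phase 1 (step /1/): integral images, computed once for the whole image.
    integrals = [integral_image(band) for band in cube]

    # Phase 2 (step /3/): scan the window over the useful area ZU; the
    # complementary zone ZC lost to edge effects is simply not visited.
    h, w = frame_shape
    d, H, W = cube.shape
    detection = np.zeros((H - h + 1, W - w + 1))
    for y in range(detection.shape[0]):
        for x in range(detection.shape[1]):
            detection[y, x] = reveal(integrals, y, x, h, w)
    return detection
```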
[0006] With this structure of the analysis method, it can be performed very quickly without requiring significant computation means. In particular, the analysis method of the invention is compatible with real-time reception of a video stream from a camera that is used to capture multispectral images continuously.
[0007] Furthermore, the use of a window which is smaller than the matrix, to calculate the display value of each pixel in the detection image, allows this detection image to be easily interpretable visually by the surveillance operator. In other words, most of the patterns that are contained in the detection image can be easily recognized by the operator. Preferably, the window frame may have dimensions that are between one fifth and one fiftieth of those of the matrix, parallel to the rows and columns. To increase a contrast of the detection image, the mask can advantageously be defined in step /2/ so that the target zone and the background zone are separated by an intermediate zone inside the window frame. In possible implementations of the invention, the display value can be obtained from the value of the revealing function for each pixel of the detection image, using one of the following methods or a combination of them: - compare the value of the revealing function with a threshold, and the display value is taken equal to zero if this value of the revealing function is lower than the threshold, otherwise the display value is taken equal to the value of the revealing function; or - apply a linear scale conversion to the value of the revealing function, and the display value is taken equal to a result of this conversion. In simple implementations of the invention, step /1/ may comprise the computation of the integral images of order one, and substep /3-2/ may itself comprise: /3-2-1/ determining two mean vectors, respectively for the target zone and the background zone, each having a coordinate for each spectral image, which is equal to an average of the intensity values of this spectral image, calculated for the pixels of the target area or of the background area, respectively; then /3-2-2/ calculating the value of the revealing function from these two mean vectors. In particular, the revealing function can be an angle between the two mean vectors in the multidimensional space whose axes are the intensities for each spectral image. More generally, the revealing function may depend on this angle between the two mean vectors.
[0008] More generally, the method of the invention can use only the integral images of order one. In this case, the second-order integral images are not necessarily calculated in step /1/. However, step /1/ may comprise the computation of the first-order and also second-order integral images, so that the value of the revealing function can be calculated in substep /3-2-2/ using integral values that are read in the second-order integral images, in addition to those read in the first-order integral images. Optionally also, the method of the invention can use only the second-order integral images. In such another case, the first-order integral images are not necessarily calculated in step /1/. In particular embodiments of the invention, substep /3-2/ may comprise a determination of a Fisher factor, in the form of a vector associated with a Fisher projection that increases a contrast of the multispectral image in the window between the target area and the background area.
[0009] In this case, the Fisher factor can itself be calculated in substep /3-2/ for each pixel of the useful area, from the integral values read in the integral images of order one and order two. The execution of the analysis method of the invention is thus even faster. In particular, the Fisher factor can be calculated in substep /3-2/ for each pixel of the useful area, in the form of a product between, on the one hand, a row vector which results from a difference between the mean vectors calculated for the target area and the background area, and, on the other hand, an inverse covariance matrix. The covariance matrix considered has a factor for each pair of spectral images from the multispectral image, which is equal to a covariance of the spectral intensity values assigned respectively by each spectral image of the pair, calculated for the pixels of the background zone. Optionally, substep /3-2/ may further comprise a calculation of two Fisher mean values, respectively for the target area and the background area, which are each equal to a dot product result between the Fisher factor and the mean vector for the target area or for the background area, respectively. In this case, the value of the revealing function may depend on a difference between the two Fisher mean values. A second aspect of the invention provides a medium readable by one or more processors, which comprises codes written on this medium and adapted to control an execution, by the processor(s), of an analysis method according to the first aspect of the invention. This second aspect of the invention is therefore a computer program, which has the nature of a commercial product or of a product recoverable in any form. A third aspect of the invention provides a method of monitoring an environment, which comprises the steps of: - simultaneously capturing a plurality of spectral images of the environment, so as to obtain a multispectral image; - analyzing the multispectral image using an analysis method according to the first aspect of the invention; and - displaying the detection image on a screen, for a monitoring operator who observes the screen. The monitoring method may further include comparing the display value of each pixel in the detection image with an alert threshold. Then, a pixel may further be displayed in this detection image with a color that is changed, with flashing or with an overlay, if its display value is greater than the alert threshold.
[0010] The attention of the surveillance operator is thus drawn even more strongly to this place in the field of observation, to determine whether an intruding element is present there. Finally, a fourth aspect of the invention proposes a device for monitoring an environment, which comprises: - means for storing a multispectral image formed of several spectral images of the same scene, which are associated with separate spectral intervals; - a screen comprising pixels which are located respectively at intersections of rows and columns of a matrix; - an image processing system, which is adapted to calculate integral images from the spectral images, and to store these integral images; - means for defining a window frame and a mask internal to the frame, the mask defining a target area and a background area inside the frame; and - a control system which is adapted to implement step /3/ of an analysis method according to the first aspect of the invention, and to display a detection image on the screen, in which each pixel of a useful area of the matrix has a display value which is obtained from the value of the revealing function calculated for that pixel. Other features and advantages of the present invention will appear in the following description of nonlimiting examples of implementation, with reference to the accompanying drawings, in which: - Figure 1 is a schematic representation of a multispectral image; - FIG. 2 represents a display screen used to implement the invention; - FIGS. 3a and 3b illustrate principles of integral image construction used to implement the invention; - FIG. 4 represents a mask that can be used to calculate a contrast inside a window, in particular modes of implementation of the invention; - FIG. 5 is a block diagram of the steps of a method according to the invention; - FIGS. 6a and 6b show sequences of calculations carried out in certain steps of the method of FIG. 5, for two different modes of implementation of the method; - Figure 7 illustrates a mean vector calculation principle that is used in the present invention; and - FIG. 8 illustrates the principle of a Fisher projection in a two-dimensional space of spectral intensities. For the sake of clarity, the dimensions of the elements which are represented in some of these figures correspond neither to real dimensions nor to real dimension ratios. In addition, identical references which are indicated in different figures designate elements which are identical or which have identical functions. The reference numeral 10 in FIG. 1 globally designates a multispectral image formed of several individual images 11, 12, 13, ... that have been simultaneously captured for the same scene. In other words, the individual images 11, 12, 13, ... have been captured by imaging channels arranged in parallel, activated at the same time and having the same input optical field. However, each image 11, 12, 13, ... has been captured by selecting a part of the radiation from the scene, which is separated from each part of the radiation used for another of the images 11, 12, 13, ... This separation is carried out according to the wavelength λ of the radiation, in one of the ways known to those skilled in the art, so that each of the spectral images 11, 12, 13, ... has been captured with radiation whose wavelength belongs to a distinct interval, preferably without overlap with each of the intervals of the other spectral images. The number of spectral images 11, 12, 13, ... may be arbitrary, for example equal to twenty spectral images, each associated with a wavelength interval whose width may be between a few nanometers and a few tens of nanometers, or more. Such a multispectral image can also be called hyperspectral, depending on the number of spectral images that compose it and the width of each of their wavelength intervals. For example, λ1, λ2, λ3, ... denote central values for the respective wavelength intervals of the spectral images 11, 12, 13, ... Depending on the application of the invention, these intervals may be between 400 nm (nanometers) and 1 μm (micrometer), or between 400 nm and 2.5 μm, for example.
[0011] Each spectral image 11, 12, 13, ... can be processed for the invention from a file that is read on a storage medium, or from a digital stream that is produced by an image capture instrument at a video rate. Depending on the case, the image data may be raw data produced by one or more image sensors, or data already processed by certain operations such as the registration of the spectral images with respect to each other, a correction of over- or under-exposure, etc. FIG. 2 shows a display screen for the multispectral image 10. It comprises a matrix 1 of image points 2, or pixels 2, which are arranged at intersections of rows and columns of the matrix. For example, the matrix 1 may have 500 columns and 500 rows of pixels 2. x and y respectively denote the direction of the rows and that of the columns. An origin point O is set at a corner of the peripheral boundary of the matrix 1, for example at the top left. Each of the spectral images 11, 12, 13, ... separately assigns an intensity value to each pixel 2 of the matrix 1.
[0012] Such an assignment is direct if each spectral image is directly captured according to the matrix 1, or can be indirect in the case where at least some of the spectral images 11, 12, 13, ... are captured according to a different matrix. In this case, an intensity value for each pixel 2 of the matrix 1 can be obtained by interpolation, for each spectral image whose initial matrix is different from the matrix 1. An integral image is calculated separately for each spectral image 11, 12, 13, ..., according to a common calculation principle which is known to those skilled in the art. Reference 11I in Fig. 3a denotes the integral image which is calculated from the spectral image 11. The value of the pixel P in the integral image 11I is calculated as the sum (sign Σ in Fig. 3a) of all the intensity values in the image 11, for the pixels 2 of the matrix 1 which are contained in a rectangular zone R whose two opposite corners are the origin point O and the pixel P itself. The integral images thus calculated have been called first-order integral images in the general description of the invention, and the values of each of them for the pixels 2 of the matrix 1 have been called integral values. FIG. 3b illustrates the principle of calculation of the second-order integral image which is obtained from the spectral images 11 and 12. This integral image is indicated by the reference (11x12)I. Its integral value for the pixel P is calculated as follows: for each pixel 2 of the rectangular zone R whose opposite corners are the origin point O and the pixel P itself, the product of the two intensity values which are assigned to this pixel 2 by the spectral images 11 and 12 is calculated (product operator in Figure 3b). Then these products are added together over the zone R, and the result of the sum is the integral value of the integral image (11x12)I at the pixel P. Second-order integral images are calculated in a similar way for all possible pairs of spectral images 11, 12, 13, ..., including pairs whose two spectral images are the same. Of course, pairs whose two images are identical but selected in the reverse order correspond to identical second-order integral images, so that they are calculated only once.
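As a concrete illustration of this calculation principle, a minimal NumPy sketch is given below (the names are hypothetical). A row and a column of zeros are prepended so that box sums can later be read without special cases at the edges; this padding convention is an implementation choice, not part of the patent.

```python
import numpy as np

def integral_image(band):
    """First-order integral image: entry (r, c) holds the sum of `band`
    over the rectangle whose opposite corners are the origin O and the
    pixel (r-1, c-1), because of the zero padding."""
    ii = np.cumsum(np.cumsum(band.astype(np.float64), axis=0), axis=1)
    return np.pad(ii, ((1, 0), (1, 0)))

def integral_image_order2(band_k, band_l):
    """Second-order integral image for a pair of spectral images: the same
    construction applied to their pixel-wise product."""
    return integral_image(band_k * band_l)
```

Since the pair (k, l) and the pair (l, k) give the same product image, only d(d+1)/2 second-order integral images need to be computed for d spectral images, consistent with the remark above.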
[0013] All first- and second-order integral images are stored (step S1 of FIG. 5), so that the integral values can be read quickly in the further execution of the method. Windows F1, F2, F3, ... are then successively selected inside the matrix 1, to form a scan of the matrix (FIG. 2). Each window is defined by a frame whose dimensions are preferably fixed, and by a position of placement of the frame in the matrix 1.
[0014] For example, the window F1 can be placed against the top-left corner of the matrix 1; then the window F2 is obtained by shifting the window frame one column to the right with respect to the window F1, etc., until the window frame abuts against the right edge of the matrix 1. The scanning can then be continued by returning to the left edge of the matrix 1, but one line of pixels lower, etc. By associating each placement of the frame with the pixel 2 of the matrix 1 which is located at the center of the window thus formed, a useful zone marked ZU is progressively scanned by the successive window centers, excluding a complementary zone ZC which results from edge effects. The zone ZU corresponds to the extent of the detection image which is constructed according to the invention. A fixed mask is also defined inside the window frame, and is transposed with the latter to each position of the frame in the matrix 1 when a different window is selected. This mask defines a target zone denoted T and a background zone denoted B inside the window frame. For example, as shown in FIG. 4, the target zone T and the background zone B may have square respective boundaries, be concentric and be separated by an intermediate zone denoted J. nxB and nyB denote the external dimensions of the background zone B in the x and y directions. They correspond to the dimensions of the window frame, for example each equal to 51 pixels. nxT and nyT denote the dimensions of the target zone T in the x and y directions, for example each equal to 7 pixels, and nxJ and nyJ denote the external dimensions of the intermediate zone J, for example each equal to 31 pixels. The values of nxB and nyB, nxT and nyT, nxJ and nyJ can be selected based on an assumption about the size of an intruding element that is sought in the imaged scene, and its assumed distance away. The construction of the detection image is now described with reference to FIG. 5 and FIGS. 6a and 6b, respectively for two different implementations of the invention. For each of these implementations separately, the same sequence of steps, which is described for the window F1, is repeated for each pixel 2 of the useful area ZU. The intensity values of each pixel 2 of the target area T and of the background area B, in the window F1, are considered for the d spectral images 11, 12, 13, ... which together constitute the multispectral image 10, d being an integer greater than or equal to two. A vector of spectral intensities is then constructed for the target zone T in the following way: it has a separate coordinate for each of the spectral images, and this coordinate is equal to an average of the intensity values of all the pixels of the target area T in this spectral image. Thus, the following vector can be constructed:

$$ m_T = \frac{1}{N_T} \begin{pmatrix} \sum_{i \in T} x_1(i) \\ \sum_{i \in T} x_2(i) \\ \vdots \\ \sum_{i \in T} x_d(i) \end{pmatrix} $$

where i is an index which numbers the pixels of the target area T, x1(i), x2(i), ..., xd(i) are the intensity values of the pixel i respectively in the spectral images 11, 12, 13, ..., and NT is the number of pixels of the target area T. In a space of intensities for the spectral images, the vector mT corresponds to an average position of the spectral intensity vectors of all the pixels of the target zone T. In other words, mT is the mean vector of the spectral intensities for the zone T. In known manner, each coordinate of the vector mT can be calculated directly from the corresponding integral image of order one, as follows, which is illustrated in FIG. 7:

$$ \sum_{i \in T} x_k(i) = \mathrm{ImInt}_k(A) + \mathrm{ImInt}_k(C) - \mathrm{ImInt}_k(B) - \mathrm{ImInt}_k(D) $$

where k is the index of the coordinate of the vector mT, less than or equal to d, A, B, C and D are the pixels of the vertices of the window F, ImInt_k(A) is the integral value at the pixel A which is read in the integral image k of order one, and so on in the same way for the pixels B, C and D. Another vector of spectral intensities is constructed analogously for the background area B:

$$ m_B = \frac{1}{N_B} \begin{pmatrix} \sum_{i \in B} x_1(i) \\ \sum_{i \in B} x_2(i) \\ \vdots \\ \sum_{i \in B} x_d(i) \end{pmatrix} $$

where NB is the number of pixels in the background area B. In general, it is different from the number NT, but not necessarily. The vector mB corresponds in the same way to an average position of the vectors of spectral intensities of all the pixels of the background area B. It is called the mean vector of the spectral intensities for the background area B. The vector mB can also be calculated from the integral images of order one, in a way that is adapted to the shape of the background area B but is easily accessible to the skilled person. The vectors mT and mB are arranged in columns and each have d coordinates. Figure 8 shows the vectors mT and mB in the case of two spectral images (d = 2). In this figure, the concentric ellipses which are referenced T and B symbolize level curves associated with constant values for numbers of pixels in the target zone T and in the background zone B. The two implementations of FIGS. 6a and 6b use the mean vectors mT and mB, which are calculated in step S2. REV denotes the revealing function that is used to build the detection image. Its value for the pixel of the detection image being calculated is determined in step S4. For the implementation of FIG. 6a, the revealing function REV is equal to the separation angle that exists between the two mean vectors mT and mB in the space of the spectral intensities. It can be calculated using the following formula:

$$ \mathrm{REV} = \arccos\left( \frac{m_T \cdot m_B}{\| m_T \| \, \| m_B \|} \right) $$

where arccos denotes the inverse function of the cosine, the numerator denotes the scalar product between the two mean vectors mT and mB, and the double vertical bars denote the vector norm. The value of the function REV as it results from this embodiment can be expressed in any of the usual angle units. The implementation of the invention which is illustrated by FIG. 6b is based on the search for a maximum contrast existing between the contents of the multispectral image 10 which are situated in the target zone T and in the background zone B. This maximum contrast value is the one which results from the application of the Fisher projection to the multispectral content of the window F1. For this, the Fisher projection is itself determined first, and can be determined using one of the methods already known. However, the method that is now described is preferred because it exploits the integral images that have already been calculated. In addition to the mean vectors mT and mB, the step S2 further comprises calculating the following covariance matrix, from the intensity values of the pixels 2 of the background area B:

$$ \mathrm{Covar}_B = \begin{pmatrix} \mathrm{Var}(x_1, x_1) & \mathrm{Covar}(x_1, x_2) & \cdots & \mathrm{Covar}(x_1, x_d) \\ \mathrm{Covar}(x_2, x_1) & \mathrm{Var}(x_2, x_2) & \cdots & \mathrm{Covar}(x_2, x_d) \\ \vdots & \vdots & \ddots & \vdots \\ \mathrm{Covar}(x_d, x_1) & \mathrm{Covar}(x_d, x_2) & \cdots & \mathrm{Var}(x_d, x_d) \end{pmatrix} $$

where Var(x1, x1) denotes the variance of the intensity values of the spectral image 11 calculated on the pixels of the background area B, Covar(x1, x2) denotes the covariance of the respective intensity values of the spectral images 11 and 12 calculated on the pixels of the background area B, etc., for all pairs of index values selected from 1, 2, ..., d. The matrix CovarB is square, of dimension d. It can be determined in a manner that is also known to those skilled in the art, from the second-order integral images and the components of the vector mB.
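The reads described above can be sketched as follows (hypothetical names; for a background zone B shaped like a frame, as in FIG. 4, its sum would be obtained as the difference between the outer-rectangle sum and the inner-rectangle sum, while this sketch uses plain rectangles for brevity):

```python
def box_sum(ii, top, left, height, width):
    """Sum over a rectangle, read from a padded integral image with the
    four-corner combination ImInt(A) + ImInt(C) - ImInt(B) - ImInt(D)."""
    return (ii[top, left] + ii[top + height, left + width]
            - ii[top, left + width] - ii[top + height, left])

def mean_vector(integrals, top, left, height, width):
    """Mean spectral vector over a rectangular zone: one coordinate per
    spectral image, each obtained in O(1) from its integral image."""
    n = height * width
    return np.array([box_sum(ii, top, left, height, width)
                     for ii in integrals]) / n

def reveal_angle(m_t, m_b):
    """Revealing function of FIG. 6a: angle between the two mean vectors."""
    c = np.dot(m_t, m_b) / (np.linalg.norm(m_t) * np.linalg.norm(m_b))
    return np.arccos(np.clip(c, -1.0, 1.0))

def covariance_matrix(integrals2, m_b, top, left, height, width):
    """Covar_B from second-order integral images, using the identity
    Covar(x_k, x_l) = E[x_k * x_l] - m_k * m_l over the zone; `integrals2`
    maps the ordered pair (k, l), k <= l, to the integral image of the
    product of spectral images k and l."""
    d = len(m_b)
    n = height * width
    cov = np.empty((d, d))
    for k in range(d):
        for l in range(k, d):
            e_kl = box_sum(integrals2[(k, l)], top, left, height, width) / n
            cov[k, l] = cov[l, k] = e_kl - m_b[k] * m_b[l]
    return cov
```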
[0015] The Fisher projection PFisher can also be determined in this step S2, in the following way:

$$ P_{\mathrm{Fisher}} = (m_T - m_B)^t \cdot \mathrm{Covar}_B^{-1} $$

where the exponent t denotes the transpose of a vector, the exponent -1 denotes the inverse of a matrix, and the dot designates the matrix product operation, here applied between the row vector (mT - mB)^t and the matrix CovarB^-1. The Fisher projection thus expressed is a row vector with d coordinates. Usually, it is intended to be applied to the vector of the intensity values of each pixel of the window F1, in the form of the following matrix product:

$$ p_{\mathrm{Fisher}}(j) = P_{\mathrm{Fisher}} \cdot x(j) $$

where j denotes any pixel in the window F1, x(j) is the vector of intensity values for this pixel j in the d spectral images, and pFisher(j) is the intensity value for the pixel j that results from the Fisher projection. In the jargon of those skilled in the art, the projection PFisher that is determined for the selected window is called the Fisher factor, and the set of intensity values pFisher(j) that is obtained for all the pixels j of the window is called the Fisher matrix. Figure 8 also shows this Fisher projection in the spectral intensity space, for the example case of two spectral images: d = 2. The notations that were introduced above are included in this figure. The Fisher matrix that is thus obtained for the window F1 is the representation of the content of the multispectral image 10 inside the window F1 which has the maximum contrast. D_mT-mB is the direction onto which the spectral intensity vectors of the pixels j are projected according to the Fisher method. The invention is based on the use of the maximum contrast that is present inside the Fisher matrix, but without it being necessary to calculate this matrix itself. A significant gain in calculation time results, which is one of the advantages of the invention.
[0016] The average of the intensity values of the Fisher matrix inside the target zone T is, for the window considered:

$$ m_{FT} = \frac{1}{N_T} \sum_{j \in T} p_{\mathrm{Fisher}}(j) $$

Similarly, the average of the intensity values of the Fisher matrix inside the background zone B is, for the same window:

$$ m_{FB} = \frac{1}{N_B} \sum_{j \in B} p_{\mathrm{Fisher}}(j) $$

The inventors have discovered that the averages mFT and mFB can be expressed in the following ways: - the mean mFT is equal to the Fisher factor applied to the mean vector of the spectral intensities for the target area T: mFT = PFisher · mT, where the dot denotes the scalar product operation between the row vector PFisher and the column vector mT; and - similarly, the mean mFB is equal to the Fisher factor applied to the mean vector of the spectral intensities for the background area B: mFB = PFisher · mB. Step S3 in Figures 5 and 6b consists of calculating the averages mFT and mFB in this way. Figure 8 also shows these Fisher mean values mFT and mFB along the direction D_mT-mB.
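A sketch of this FIG. 6b computation follows (hypothetical names). Since CovarB is symmetric, the row vector PFisher can be obtained by solving a linear system instead of explicitly inverting the matrix; the small ridge term is an implementation precaution, not part of the patent.

```python
def reveal_fisher(m_t, m_b, cov_b, ridge=1e-9):
    """Revealing function of FIG. 6b: difference of the two Fisher means.
    P_Fisher = (m_T - m_B)^t . Covar_B^{-1}; since Covar_B is symmetric,
    solving Covar_B p = (m_T - m_B) gives p = P_Fisher^t."""
    d = len(m_t)
    p = np.linalg.solve(cov_b + ridge * np.eye(d), m_t - m_b)
    m_ft = p @ m_t  # Fisher mean for the target zone T
    m_fb = p @ m_b  # Fisher mean for the background zone B
    return m_ft - m_fb
```

Note that mFT - mFB collapses to the quadratic form (mT - mB)^t CovarB^-1 (mT - mB), which makes explicit why the full Fisher matrix never has to be computed, as stated above.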
[0017] Then, in step S4, the revealing function REV depends on the difference mFT - mFB. Typically, REV can be a monotonic continuous function, increasing or decreasing, of the difference mFT - mFB, but it can also be a monotonic increasing or decreasing step function. For all the embodiments of the invention, the value of the revealing function REV which has been calculated for a position of the window in the matrix 1 is attributed to the pixel 2 of the useful area ZU which determined this position. In the implementation modes taken as examples, it is assigned to the pixel 2 on which the window is centered. It is recalled that this value of the revealing function has been calculated for each pixel 2 of the useful area ZU solely from the first- and/or second-order integral images. These integral images, themselves, have been previously calculated only once, in step S1. An economy of calculations is obtained in this way, which is considerable and thanks to which the method of the invention is compatible with a real-time execution as successive multispectral images are captured, received in a video stream or read on a recording medium. Optionally, the values of the revealing function REV may themselves be processed in step S5, in particular with the aim of making the detection image even easier to understand for the surveillance operator. Such processing can consist in comparing each value of the function REV with a threshold, or several thresholds, and modifying this value according to the result of the comparison. For example, the value of the function REV for any one of the pixels of the useful area ZU can be reduced to zero when it is initially lower than a first threshold. Simultaneously, the value of REV can be increased to a maximum value when it exceeds a second threshold, so that the pixel concerned appears more clearly in the detection image. The attention of the surveillance operator can thus be drawn more strongly to this location of the detection image. The first and/or second threshold may be fixed, or determined according to a statistical study of all the values of the REV function that have been obtained for all the pixels 2 of the useful area ZU. For example, each threshold can be calculated according to an average value of the function REV, calculated on the pixels 2 of the useful area ZU. The processing of the values of the function REV can also include a linear scale conversion, in particular to make the amplitude of variation of the function REV over the useful area ZU coincide with the display intensity range of the pixels in the detection image. Thus, the null value for the display intensity can be attributed to the minimum value that has been obtained for the REV function in the useful area ZU, and the maximum value of the display intensity can be assigned simultaneously to the maximum value reached by the REV function in the useful area ZU. A display intensity value then results from such a linear conversion, for each value of the REV function intermediate between the two minimum and maximum values. Possibly, thresholding and scale conversion processes can be combined.
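A possible sketch of this step S5 post-processing (hypothetical names; the statistical choice of the threshold is one example among those mentioned above):

```python
def to_display(rev, threshold=None):
    """Step S5 sketch: threshold the revealing values, then apply a linear
    scale conversion to the 8-bit display range [0, 255]."""
    out = rev.astype(np.float64).copy()
    if threshold is None:
        threshold = out.mean()   # example: threshold from image statistics
    out[out < threshold] = 0.0   # values below the first threshold -> zero
    lo, hi = out.min(), out.max()
    if hi > lo:                  # linear conversion onto the display range
        out = (out - lo) * (255.0 / (hi - lo))
    return out.astype(np.uint8)
```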
[0018] The detection image is then constructed by assigning a display value to each pixel 2 in the useful area ZU of the matrix 1. This display value can be directly equal to the value of the revealing function REV which has been obtained for this pixel in step S4, or result from this value of the REV function after the processing of step S5. The detection image is then displayed on the screen in step S6, to be observed by the monitoring operator. Optionally, this detection image can be displayed alternately with a representation of the monitoring field that reproduces a visual perception by the human eye, that is to say a perception from light that belongs to the visible wavelength range and that comes from the field of view. Thus, the operator's orientation within the surveillance field is facilitated, to locate elements that are revealed by the detection image while these elements are invisible directly to the human eye. Optionally, various effects may be added to the detection image to further highlight areas of the detection image where the revealing function has produced display values that are above an alert threshold. These effects can be a color display of the pixels concerned and/or a blinking thereof and/or the addition of an overlay. The alert threshold can be predefined or derived from a statistical study of the values of the revealing function or of the display values in the useful area ZU. It is understood that the invention may be reproduced by changing minor aspects of the embodiments which have been described in detail above, while retaining the main advantages which have been mentioned and which are recalled once more: - a detection image that is coherent and has a homogeneous construction mode is presented to the surveillance operator to reveal camouflaged elements present in the field of view; - these uncovered elements can be recognized by the operator from the detection image; and - the detection image can be calculated and presented to the operator in real time during the surveillance mission.
Claims:
Claims (15)
[0001]
1. A method of analyzing a multispectral image (10), said multispectral image comprising a plurality of spectral images (11, 12, 13, ...) of the same scene but corresponding to different spectral intervals, each spectral image assigning an intensity value to each pixel (2) which is located at an intersection of a row and a column of a matrix (1) of the multispectral image, and a point of origin (O) being defined at a corner of the peripheral boundary of the matrix; wherein a detection image is constructed by assigning a display value to each pixel of a useful area (ZU) of the matrix, said display value being derived from a revealing function which quantifies a content difference of the multispectral image between two zones determined by said pixel; the method comprising the following steps:
/1/ calculating integral images of at least one of the following types:
- for each spectral image, an integral image of order one, by assigning to each calculation pixel an integral value equal to a sum of the intensity values of said spectral image for all the pixels contained in a rectangle having two opposite vertices located respectively on the origin point and on said calculation pixel;
- for each pair of spectral images coming from the multispectral image, an integral image of order two, by attributing to each calculation pixel another integral value equal to a sum, for all the pixels contained in the rectangle having two opposite vertices located respectively on the origin point and on said calculation pixel, of the products of the two intensity values relating to the same pixel but respectively assigned by each spectral image of the pair;
/2/ defining a fixed window frame and a mask internal to the frame, said mask defining a target area and a background area inside said frame; and
/3/ for each pixel (2) of the useful area (ZU) of the matrix (1), performing the following substeps:
/3-1/ placing a window at a position in the matrix which is determined by the pixel, said window being limited by the frame defined in step /2/;
/3-2/ using integral values read in the integral images calculated in step /1/, determining a value of the revealing function that quantifies the content difference of the multispectral image between the target area and the background area within the window at the position determined by said pixel; and
/3-3/ assigning the value of the revealing function, calculated in substep /3-2/, to the pixel for which step /3/ is executed.
[0002]
2. Analysis method according to claim 1, wherein the window frame has dimensions between one-fifth and one-fiftieth of the dimensions of the matrix parallel to the rows and columns.
[0003]
3. Analysis method according to claim 1 or 2, wherein the mask is defined in step /2/ so that the target zone and the background zone are separated by an intermediate zone inside the window frame.
[0004]
4. A method according to any one of the preceding claims, wherein the display value is obtained from the value of the revealing function for each pixel of the detection image, using one of the following methods or a combination of said methods: - comparing the value of the revealing function with a threshold, and the display value is taken equal to zero if said value of the revealing function is below the threshold, otherwise said display value is taken equal to the value of the revealing function; or - applying a linear scale conversion to the value of the revealing function, and the display value is taken equal to a result of the conversion.
[0005]
5. Analysis method according to any one of the preceding claims, wherein step /1/ comprises the calculation of the integral images of order one, and substep /3-2/ itself comprises: /3-2-1/ using integral values read in the first-order integral images, determining two mean vectors, respectively for the target area and the background area, each having a coordinate for each spectral image, which is equal to an average of the intensity values of said spectral image, calculated for the pixels of the target area or of the background area, respectively; then /3-2-2/ calculating the value of the revealing function from said two mean vectors.
[0006]
6. Analysis method according to claim 5, wherein the revealing function is an angle between the two mean vectors in a multidimensional space whose axes are the intensities for each spectral image, or depends on said angle.
[0007]
7. The method of any one of the preceding claims, wherein step /1/ comprises calculating first-order and second-order integral images, and in substep /3-2-2/, the value of the revealing function is calculated using integral values read in the second-order integral images, in addition to integral values read in the first-order integral images.
[0008]
8. An analysis method according to any one of the preceding claims, wherein substep /3-2/ comprises a determination of a Fisher factor, in the form of a vector associated with a Fisher projection that increases a contrast of the multispectral image in the window between the target area and the background area.
[0009]
9. An analysis method according to claims 7 and 8, wherein the Fisher factor is itself calculated in substep /3-2/ for each pixel of the useful area, from the integral values read in the integral images of order one and order two.
[0010]
10. An analysis method according to claims 5 and 9, wherein the Fisher factor is calculated in substep /3-2/ for each pixel of the useful area, in the form of a product between, on the one hand, a row vector resulting from a difference between the mean vectors calculated for the target area and the background area, and, on the other hand, an inverse covariance matrix, said covariance matrix having a factor for each pair of spectral images from the multispectral image, which is equal to a covariance of the spectral intensity values assigned respectively by each spectral image of the pair, calculated for the pixels of the background area.
[0011]
11. The method of claim 5 and any one of claims 8 to 10, wherein substep /3-2/ comprises a calculation of two Fisher mean values, respectively for the target area and the background area, each equal to a scalar product result between the Fisher factor and the mean vector for the target area or for the background area, respectively; and the value of the revealing function depends on a difference between said two Fisher mean values.
[0012]
12. Computer program product, comprising a medium readable by one or more processors, and codes written on said medium and adapted to cause one or more processors to execute an analysis method in accordance with any one of claims 1 to 11.
[0013]
13. A method of monitoring an environment, comprising the steps of: - simultaneously capturing a plurality of spectral images of the environment, so as to obtain a multispectral image; - analyzing the multispectral image using an analysis method according to any one of claims 1 to 11; and - displaying the detection image on a screen, for a monitoring operator who observes the screen.
[0014]
14. The monitoring method of claim 13, further comprising comparing the display value of each pixel in the detection image with an alert threshold, and wherein a pixel is further displayed in the detection image with a modified color, a flashing or an overlay if the display value of said pixel is greater than the alert threshold.
[0015]
15. Device for monitoring an environment, comprising: - means for storing a multispectral image (10) formed of several spectral images (11, 12, 13, ...) of the same scene, which are associated with separate spectral intervals; - a screen comprising pixels (2) located respectively at intersections of rows and columns of a matrix (1); - an image processing system adapted to calculate integral images from the spectral images (11, 12, 13, ...), and to store said integral images; - means for defining a window frame and a mask internal to the frame, said mask defining a target area and a background area inside said frame; and - a control system adapted to implement step /3/ of an analysis method according to any one of claims 1 to 11, and to display a detection image on the screen, wherein each pixel (2) of a useful area (ZU) of the matrix (1) has a display value which is obtained from the value of the revealing function calculated for said pixel.
Similar technologies:
Publication number | Publication date | Patent title
FR3013878A1|2015-05-29|ANALYSIS OF A MULTISPECTRAL IMAGE
EP3074921B1|2018-01-31|Analysis of a multispectral image
Cheng et al.2014|Cloud removal for remotely sensed images by similar pixel replacement guided with a spatio-temporal MRF model
EP3114831B1|2021-06-09|Optimised video denoising for heterogeneous multisensor system
EP3055832B1|2018-01-31|Method for viewing a multi-spectral image
FR2982394A1|2013-05-10|SEARCH FOR A TARGET IN A MULTISPECTRAL IMAGE
EP1656650B1|2008-03-05|Method and system for detecting a body in a zone located proximate an interface
EP3200153B1|2018-05-23|Method for detecting targets on the ground and in motion, in a video stream acquired with an airborne camera
EP2555161A1|2013-02-06|Method and device for calculating a depth map from a single image
FR3028988A1|2016-05-27|METHOD AND APPARATUS FOR REAL-TIME ADAPTIVE FILTERING OF BURNED DISPARITY OR DEPTH IMAGES
EP2909671B1|2016-12-07|Method for designing a single-path imager able to estimate the depth of field
Sargent et al.2020|Conditional generative adversarial network demosaicing strategy for division of focal plane polarimeters
EP3216213B1|2020-12-30|Method for detecting defective pixels
Kanaev2015|Compact full-motion video hyperspectral cameras: development, image processing, and applications
EP3591610B1|2021-01-27|Method for generating a multispectral image
Nielsen et al.2014|Tree cover mapping for assessing greater sage-grouse habitat in eastern Oregon
BE1021546B1|2015-12-11|METHOD AND SYSTEM FOR STORING WAVEFORM DATA.
EP3072110B1|2018-04-04|Method for estimating the movement of an object
Denaro et al.2019|Hybrid Canonical Correlation Analysis and Regression for Radiometric Normalization of Cross-Sensor Satellite Images
EP1426901B1|2006-02-22|Method of detecting point targets and surveillance system applying the method
FR3112228A1|2022-01-07|Device and method for generating a mask of the silhouette of the profile of a structure
FR3073311A1|2019-05-10|METHOD FOR ESTIMATING THE INSTALLATION OF A CAMERA IN THE REFERENTIAL OF A THREE-DIMENSIONAL SCENE, DEVICE, INCREASED REALITY SYSTEM, AND COMPUTER PROGRAM
FR3057690A1|2018-04-20|METHOD OF COMPLETING AN IMAGE OF A SCENE UTILIZING THE SPECTRAL CHARACTERISTICS OF THE SCENE
FR3023924A1|2016-01-22|DETECTOR OF UNDERWATER SHIPS
Mercovich2012|Techniques for automatic large scale change analysis of temporal multispectral imagery
Patent family:
Publication number | Publication date
FR3013878B1|2016-01-01|
IL245871D0|2016-06-30|
EP3074920B1|2018-01-31|
CA2931845A1|2015-06-04|
CA2931845C|2019-06-25|
IL245871A|2018-08-30|
EP3074920A1|2016-10-05|
WO2015078934A1|2015-06-04|
US20170024616A1|2017-01-26|
US10089536B2|2018-10-02|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
FR2982393A1|2011-11-09|2013-05-10|Sagem Defense Securite|SEARCH FOR A TARGET IN A MULTISPECTRAL IMAGE|WO2017012899A1|2015-07-21|2017-01-26|Safran Electronics & Defense|Method for displaying a laser spot|US7627596B2|2001-02-22|2009-12-01|International Business Machines Corporation|Retrieving handwritten documents using multiple document recognizers and techniques allowing both typed and handwritten queries|
US7200243B2|2002-06-28|2007-04-03|The United States Of America As Represented By The Secretary Of The Army|Spectral mixture process conditioned by spatially-smooth partitioning|
US8897571B1|2011-03-31|2014-11-25|Raytheon Company|Detection of targets from hyperspectral imagery|
US8670628B2|2011-08-16|2014-03-11|Raytheon Company|Multiply adaptive spatial spectral exploitation|
US9721319B2|2011-10-14|2017-08-01|Mastercard International Incorporated|Tap and wireless payment methods and devices|
FR3011663B1|2013-10-07|2015-11-13|Sagem Defense Securite|METHOD FOR VISUALIZING A MULTISPECTRAL IMAGE|
FR3013876B1|2013-11-28|2016-01-01|Sagem Defense Securite|ANALYSIS OF A MULTISPECTRAL IMAGE|
FR3013878B1|2013-11-28|2016-01-01|Sagem Defense Securite|ANALYSIS OF A MULTISPECTRAL IMAGE|FR3013878B1|2013-11-28|2016-01-01|Sagem Defense Securite|ANALYSIS OF A MULTISPECTRAL IMAGE|
FR3013876B1|2013-11-28|2016-01-01|Sagem Defense Securite|ANALYSIS OF A MULTISPECTRAL IMAGE|
FR3043823B1|2015-11-12|2017-12-22|Sagem Defense Securite|METHOD FOR DECAMOUFLING AN OBJECT|
WO2017197452A1|2016-05-16|2017-11-23|Sensen Networks Pty Ltd|System and method for automated table game activity recognition|
CN108492279B|2018-02-11|2020-05-05|杭州鸿泉物联网技术股份有限公司|Method and system for detecting on-off state of vehicle tarpaulin|
CN110608676B|2019-08-26|2021-10-26|中国科学院重庆绿色智能技术研究院|Shear displacement measurement method, shear displacement measurement module and multi-parameter combined monitoring system|
CN111915625B|2020-08-13|2021-04-13|湖南省有色地质勘查研究院|Energy integral remote sensing image terrain shadow automatic detection method and system|
Legal status:
2015-10-23| PLFP| Fee payment|Year of fee payment: 3 |
2016-10-24| PLFP| Fee payment|Year of fee payment: 4 |
2017-03-03| CD| Change of name or company name|Owner name: SAGEM DEFENSE SECURITE, FR Effective date: 20170126 |
2017-10-20| PLFP| Fee payment|Year of fee payment: 5 |
2018-10-24| PLFP| Fee payment|Year of fee payment: 6 |
2019-10-22| PLFP| Fee payment|Year of fee payment: 7 |
2021-08-06| ST| Notification of lapse|Effective date: 20210705 |
Priority:
Application number | Filing date | Patent title
FR1361739A|FR3013878B1|2013-11-28|2013-11-28|ANALYSIS OF A MULTISPECTRAL IMAGE|FR1361739A| FR3013878B1|2013-11-28|2013-11-28|ANALYSIS OF A MULTISPECTRAL IMAGE|
CA2931845A| CA2931845C|2013-11-28|2014-11-26|Analysis of a multispectral image|
US15/100,212| US10089536B2|2013-11-28|2014-11-26|Analysis of a multispectral image|
PCT/EP2014/075714| WO2015078934A1|2013-11-28|2014-11-26|Analysis of a multispectral image|
EP14802897.0A| EP3074920B1|2013-11-28|2014-11-26|Analysis of a multispectral image|
IL245871A| IL245871A|2013-11-28|2016-05-26|Analysis of a multispectral image|